11 research outputs found

    Towards improving emotion self-report collection using self-reflection

    In an Experience Sampling Method (ESM) based emotion self-report collection study, engaging participants over a long period is challenging because answering self-report probes repeatedly becomes tedious. This often degrades the collected data, as participants drop out midway or give arbitrary responses. Self-reflection (analyzing past activities in order to operate more efficiently in the future) has been used effectively to engage participants in logging physical, behavioral, or psychological data in Quantified Self (QS) studies. This motivates us to apply self-reflection to improve the emotion self-report collection procedure. We design, develop, and deploy a self-reflection interface and augment it with a smartphone keyboard-based emotion self-report collection application. The interface

    Designing an experience sampling method for smartphone based emotion detection

    Smartphones provide the capability to perform in-situ sampling of human behavior using the Experience Sampling Method (ESM). Designing an ESM schedule involves probing the user repeatedly at suitable moments to collect self-reports. Generating timely probes that elicit high-fidelity responses while keeping the probing rate low is challenging. In mobile-based ESM, the timeliness of a probe is also affected by the user's availability to respond to the self-report request. Thus,

    Smart-phone based spatio-temporal sensing for annotated transit map generation

    City transit maps are an important resource for public navigation in today's digital world. However, the availability of transit maps in many developing countries is very limited, primarily due to the socio-economic factors that drive privately operated and partially regulated transport services. Public transport in these cities is marred by factors such as uncoordinated waiting times at bus stops, crowding in buses, and sporadic road conditions, which also need to be annotated so that commuters can make informed decisions. Interestingly, many of these factors are spatio-temporal in nature. In this paper, we develop CityMap, a system to automatically extract transit routes along with their eccentricities from spatio-temporal crowdsensed data collected via commuters' smart-phones. We apply a learning-based methodology coupled with a feature-selection mechanism to filter the necessary information out of raw smart-phone sensor data with minimal user engagement and drain of battery

    Impact of experience sampling methods on tap pattern based emotion recognition

    Smartphone-based emotion recognition uses predictive modeling to recognize a user's mental state. In predictive modeling, determining the ground truth plays a crucial role in labeling and training the model. The Experience Sampling Method (ESM) is widely used in behavioral science to gather user responses about mental states. Smartphones equipped with sensors provide new avenues for designing experience sampling: sensors supply multiple contexts that can be used to trigger the collection of user responses. However, subsampling of sensor data can bias the inference drawn from trigger-based ESM. We investigate whether continuous sensor data simplifies the design of ESM. We use a user's typing pattern on the smartphone as the context that triggers response collection. We compare context-based and time-based ESM designs to determine the impact of ESM strategy on emotion modeling. The results indicate how different ESM designs compare against each other
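The two probing strategies this abstract compares can be sketched as simple trigger predicates. This is an illustrative sketch only, not the study's implementation; the thresholds and function names are invented for the example.

```python
# Two ESM probing strategies: a time-based trigger fires at fixed
# intervals, while a context-based trigger fires when a typing session
# ends. All thresholds below are hypothetical.
PROBE_INTERVAL_S = 3600        # time-based: at most one probe per hour
MIN_SESSION_TAPS = 20          # context-based: only after substantial typing

def time_based_trigger(now_s: float, last_probe_s: float) -> bool:
    """Fire a probe once the fixed interval has elapsed."""
    return now_s - last_probe_s >= PROBE_INTERVAL_S

def context_based_trigger(session_taps: int, session_ended: bool) -> bool:
    """Fire a probe when a sufficiently long typing session ends."""
    return session_ended and session_taps >= MIN_SESSION_TAPS

print(time_based_trigger(7200.0, 0.0))     # interval elapsed -> True
print(context_based_trigger(35, True))     # long session ended -> True
print(context_based_trigger(5, True))      # too few taps -> False
```

The bias question the abstract raises arises because the context-based trigger only samples moments when the user is typing, whereas the time-based trigger samples uniformly.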

    Emotion detection from touch interactions during text entry on smartphones

    There are different modes of interaction with a software keyboard on a smartphone, such as typing and swyping. Patterns of such touch interactions on a keyboard may reflect a user's emotions. Since users may switch between touch modalities while using a keyboard, automatic detection of emotion from touch patterns must consider both modalities in combination. In this paper, we focus on identifying features of touch interactions with a smartphone keyboard that lead to a personalized model for inferring user emotion. Since distinguishing typing from swyping activity is important for recording the correct features, we designed a technique to identify the modality correctly. Ground-truth labels for user emotion are collected directly from the user through periodic self-reports. We jointly model typing and swyping features and correlate them with user-provided self-reports to build a personalized machine-learning model that detects four emotion states (happy, sad, stressed, relaxed). We combine these design choices into an Android application, TouchSense, and evaluate it in a 3-week in-the-wild study involving 22 participants. Our key evaluation results and post-study participant assessment demonstrate that it is possible to predict these emotion states with an average accuracy (AUCROC) of 73% (std. dev. 6%, maximum 87%) from these two touch interactions alone
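The modeling step described above — per-user features labeled with self-reported emotion, scored by multi-class AUCROC — could look roughly like the following. This is a hedged sketch with synthetic data; the feature set, classifier choice, and labels are assumptions for illustration, not the TouchSense implementation.

```python
# Sketch: train a per-user classifier on touch features labeled with
# self-reported emotion, and score it with one-vs-rest AUCROC.
# Features and data here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
# Hypothetical features: typing speed, inter-tap delay, swipe length,
# backspace rate
X = rng.normal(size=(n, 4))
# Four self-reported emotion states serve as ground-truth labels
# (0=happy, 1=sad, 2=stressed, 3=relaxed)
y = rng.integers(0, 4, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)
# Multi-class AUCROC, the metric the abstract reports
auc = roc_auc_score(y_te, proba, multi_class="ovr")
print(round(auc, 2))
```

With random labels the AUCROC hovers around 0.5; real typing/swyping features are what lift it to the reported 73%.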

    Does emotion influence the use of auto-suggest during smartphone typing?

    Typing-based interfaces are common across many mobile applications, especially messaging apps. To reduce the difficulty of typing with keyboard applications on space-constrained smartphones and smartwatches, several techniques, such as auto-complete and auto-suggest, have been implemented. Although helpful, these techniques do add cognitive load on the user. Hence, beyond improving the word recommendations themselves, it is useful to understand the pattern of use of auto-suggestions during typing. Among the several factors that may influence use of auto-suggest, the role of emotion has been mostly overlooked, often due to the difficulty of unobtrusively inferring emotion. With advances in affective computing and the ability to infer a user's emotional state accurately, it is imperative to investigate how auto-suggest can be guided by emotion-aware decisions. In this work, we investigate correlations between user emotion and usage of auto-suggest, i.e., whether users prefer to use auto-suggest in specific emotion states. We developed an Android keyboard application that records auto-suggest usage and collects emotion self-reports from users in a 3-week in-the-wild study. Analysis of the dataset reveals a relationship between users' reported emotion states and their use of auto-suggest. We used the data to train personalized models for predicting use of auto-suggest in specific emotion states. The model can predict use of auto-suggest with an average accuracy (AUCROC) of 82%, showing the feasibility of emotion-aware auto-suggestion

    ComfRide: A smartphone based system for comfortable public transport recommendation

    Passenger comfort is a major factor in a commuter's decision to use public transport. Existing studies suggest that factors like overcrowding, jerkiness, and traffic congestion correlate well with a passenger's (dis)comfort. An online survey conducted with more than 300 participants from 12 different countries reveals that different personalized and context-dependent factors influence passenger comfort during travel by public transport. Leveraging these findings, we identify correlations between comfort level and these dynamic parameters, and implement a smartphone-based application, ComfRide, which recommends t

    Unsupervised annotated city traffic map generation

    Public bus services in many cities in countries like India are controlled by private owners; hence, building up a database of all the bus routes is non-trivial. In this paper, we leverage smart-phone based sensing to crowdsource and populate an information repository of bus routes in a city. We have developed an intelligent data-logging module for smartphones and a server-side processing mechanism to extract road and bus-route information. From a 3-month study involving more than 30 volunteers in 3 different cities in India, we found that the developed system, CrowdMap, can annotate bus routes wit

    Exploring smartphone keyboard interactions for Experience Sampling Method driven probe generation

    Keyboard interaction patterns on a smartphone are the input to many intelligent emotion-aware applications, such as adaptive interfaces, optimized keyboard layouts, and automatic emoji recommendation in IM applications. The simplest approach, the Experience Sampling Method (ESM), is to systematically gather self-reported emotion labels from users, which act as ground-truth labels, and build a supervised prediction model for emotion inference. However, since manual self-reporting is fatigue-inducing and attention-demanding, self-report requests have to be scheduled at favorable moments to ensure high-fidelity responses. In this paper, we perform fine-grained keyboard interaction analysis to determine suitable probing moments. Keyboard interaction patterns, both cadence and latency between strokes, translate nicely to frequency- and time-domain analyses. We perform a 3-week in-the-wild study (N = 22) to log keyboard interaction patterns and self-reported details indicating (in)opportune probing moments. Analysis of the dataset reveals that time-domain features (e.g., session length, session duration) and frequency-domain features (e.g., number of peak amplitudes, value of the peak amplitude) vary significantly between opportune and inopportune probing moments. Driven by these analyses, we develop a generalized (all-user) Random Forest based model that identifies opportune probing moments with an average F-score of 93%. We also carry out an explainability analysis of the model using SHAP (SHapley Additive exPlanations), which reveals that session length and peak amplitude have the strongest influence on determining the probing moments
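The pipeline this abstract describes — time- and frequency-domain session features feeding a Random Forest that flags opportune probing moments, evaluated by F-score — can be mocked up as below. The data is synthetic and the feature columns are placeholders; only the overall shape of the pipeline follows the abstract.

```python
# Sketch: classify opportune vs. inopportune probing moments from
# session features with a Random Forest and report F1.
# Synthetic stand-in data; not the study's dataset or code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 600
# Assumed feature columns: session length, session duration (time
# domain); number of peaks, peak amplitude (frequency domain)
X = rng.normal(size=(n, 4))
# Make the label depend on session length and peak amplitude, echoing
# the SHAP finding that these two features dominate
y = ((X[:, 0] + X[:, 3]) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
clf = RandomForestClassifier(random_state=1).fit(X_tr, y_tr)
f1 = f1_score(y_te, clf.predict(X_te))
print(round(f1, 2))
```

Because the synthetic label is a simple function of two features, the forest recovers it easily; the reported 93% F-score on real typing data is the substantive result.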

    Detecting mobility context over smartphones using typing and smartphone engagement patterns

    Most of the latest context-based applications capture a user's mobility using Inertial Measurement Unit (IMU) sensors like the accelerometer and gyroscope, which do not need explicit user permission for application access. Although these sensors provide highly accurate mobility-context information, existing studies have shown that they can lead to undesirable leakage of location information. To evade this breach of location privacy, many state-of-the-art studies suggest imposing stringent restrictions on the use of IMU sensors. However, in this paper, we show that typing and smartphone engagement patterns can act as an alternative modality for sniffing a user's mobility context, even if the IMU sensors are not sampled at all. We develop an adversarial framework, named ConType, which exploits the signatures exposed by typing and smartphone engagement patterns to track a user's mobility. Rigorous experiments with an in-the-wild dataset show that ConType can track mobility contexts with an average micro-F1 of 0.87 (±0.09) without using IMU data. Through additional experiments, we also show that ConType can track mobility stealthily with very low power and resource footprints, further aggravating the risk
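The micro-F1 metric reported for ConType is computed over all predicted mobility-context instances pooled together. A minimal sketch, with invented placeholder labels rather than the study's data:

```python
# Micro-averaged F1 over multi-class mobility-context predictions.
# For multi-class problems where every instance gets exactly one
# label, micro-F1 equals overall accuracy.
from sklearn.metrics import f1_score

y_true = ["still", "walking", "vehicle", "walking", "still", "vehicle"]
y_pred = ["still", "walking", "vehicle", "still",   "still", "vehicle"]

f1_micro = f1_score(y_true, y_pred, average="micro")
print(round(f1_micro, 2))  # 5 of 6 correct -> 0.83
```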